
    Do Finite-Size Lyapunov Exponents Detect Coherent Structures?

    Ridges of the Finite-Size Lyapunov Exponent (FSLE) field have been used as indicators of hyperbolic Lagrangian Coherent Structures (LCSs). A rigorous mathematical link between the FSLE and LCSs, however, has been missing. Here we prove that an FSLE ridge satisfying certain conditions signals a nearby ridge of some Finite-Time Lyapunov Exponent (FTLE) field, which in turn indicates a hyperbolic LCS under further conditions. Other FSLE ridges violating our conditions, however, are seen to be false positives for LCSs. We also find further limitations of the FSLE in Lagrangian coherence detection, including ill-posedness, artificial jump discontinuities, and sensitivity with respect to the computational time step.
    Comment: 22 pages, 7 figures; v3 corrects the z-axis labels of Fig. 2 (left) in the version published in Chaos
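    The FTLE field mentioned in this abstract can be sketched numerically. Below is a minimal, illustrative computation (not the authors' implementation) of the FTLE from the largest eigenvalue of the Cauchy-Green strain tensor, assuming an analytically known flow map. The linear saddle flow used here has FTLE exactly 1 everywhere; grid sizes and the integration time are arbitrary choices.

    ```python
    import numpy as np

    def ftle(flow_map, x, y, T):
        """Finite-Time Lyapunov Exponent on a grid via the Cauchy-Green tensor."""
        X, Y = np.meshgrid(x, y, indexing="ij")
        Fx, Fy = flow_map(X, Y, T)
        # Flow-map gradient by finite differences on the grid
        dFx_dx, dFx_dy = np.gradient(Fx, x, y)
        dFy_dx, dFy_dy = np.gradient(Fy, x, y)
        sigma = np.zeros_like(X)
        for i in range(X.shape[0]):
            for j in range(X.shape[1]):
                dPhi = np.array([[dFx_dx[i, j], dFx_dy[i, j]],
                                 [dFy_dx[i, j], dFy_dy[i, j]]])
                C = dPhi.T @ dPhi                  # Cauchy-Green strain tensor
                lam_max = np.linalg.eigvalsh(C)[-1]
                sigma[i, j] = np.log(lam_max) / (2 * abs(T))
        return sigma

    # Linear saddle flow: Phi_T(x, y) = (x e^T, y e^-T); its FTLE is exactly 1.
    saddle = lambda X, Y, T: (X * np.exp(T), Y * np.exp(-T))
    field = ftle(saddle, np.linspace(-1, 1, 21), np.linspace(-1, 1, 21), T=2.0)
    ```

    FSLE ridge extraction would additionally require tracking separation distances to a fixed growth factor, which is where the ill-posedness discussed above enters.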

    On the Application of Genetic Algorithms to Differential Equations

    Genetic algorithms can be used to solve optimization problems. Such a technique may also be applied to solve differential equations.
    Keywords: differential equations; genetic algorithms
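    As an illustration of the idea (not this paper's implementation), one can minimize the residual of an ODE over the coefficients of a polynomial trial solution with a simple genetic algorithm. The test problem y' = y, y(0) = 1 below, and all population sizes and mutation rates, are arbitrary choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Trial solution y(t) = 1 + a1*t + a2*t^2 + a3*t^3 satisfies y(0) = 1 by construction.
    t = np.linspace(0.0, 1.0, 50)

    def residual(a):
        """Squared residual of y' - y = 0 for the polynomial trial solution."""
        y  = 1 + a[0]*t + a[1]*t**2 + a[2]*t**3
        dy = a[0] + 2*a[1]*t + 3*a[2]*t**2
        return np.sum((dy - y)**2)

    def ga(pop_size=60, gens=200, sigma=0.1):
        pop = rng.normal(0, 1, (pop_size, 3))
        for _ in range(gens):
            fit = np.array([residual(ind) for ind in pop])
            parents = pop[np.argsort(fit)[:pop_size // 2]]   # truncation selection
            pa = parents[rng.integers(0, len(parents), pop_size)]
            pb = parents[rng.integers(0, len(parents), pop_size)]
            mask = rng.random((pop_size, 3)) < 0.5
            children = np.where(mask, pa, pb)                # uniform crossover
            children += rng.normal(0, sigma, children.shape) # mutation
            children[0] = parents[0]                         # elitism
            pop = children
        fit = np.array([residual(ind) for ind in pop])
        return pop[np.argmin(fit)]

    best = ga()  # coefficients should approach the Taylor series of e^t
    ```

    The residual here is quadratic in the coefficients, so the GA is competing against methods like least squares; its appeal is that the same loop works unchanged for nonlinear ODEs.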

    Polynomial Interpolation and Applications to Autoregressive Models

    Polynomial interpolation can be used to approximate functions and their derivatives. Some autoregressive models can be stated by using polynomial interpolation and function approximation.
    Keywords: polynomial interpolation; autoregressive models
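    A minimal sketch of the interpolation step: recovering a polynomial and its derivative from samples via a Vandermonde solve (equivalent to Lagrange interpolation). The node choices below are illustrative.

    ```python
    import numpy as np

    def interp_coeffs(xs, ys):
        """Coefficients (ascending order) of the polynomial through (xs, ys)."""
        # Solve the Vandermonde system V a = y
        V = np.vander(xs, increasing=True)
        return np.linalg.solve(V, ys)

    xs = np.array([0.0, 1.0, 2.0, 3.0])
    ys = xs**3                                    # sample a cubic at 4 nodes
    a = interp_coeffs(xs, ys)                     # exactly recovers x^3
    deriv = np.polynomial.polynomial.polyder(a)   # coefficients of 3x^2
    ```

    For many nodes a Vandermonde solve becomes ill-conditioned; barycentric Lagrange or Chebyshev nodes are the standard remedies.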

    On the Implementation and Use of a Genetic Algorithm with Genetic Acquisitions

    A genetic algorithm is convergent when genetic mutations occur along the gradient direction of the objective function. These genetic mutations are called genetic acquisitions (Mateescu, 2005). We improved the algorithm and its implementation by using the characteristics of the parents to generate new individuals. Finally, we applied the genetic algorithm to find the parameters of a Cobb-Douglas function.
    Keywords: evolutionary algorithms; optimization
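    A sketch of the final application, under assumptions not in the abstract: the data are synthetic, and a plain blend crossover stands in for the paper's "genetic acquisitions". It fits the parameters of a Cobb-Douglas function Q = A * K^alpha * L^beta with a real-coded GA in log space.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data from Q = A * K^alpha * L^beta with A=2, alpha=0.3, beta=0.7
    K = rng.uniform(1, 10, 100)
    L = rng.uniform(1, 10, 100)
    Q = 2.0 * K**0.3 * L**0.7

    def loss(p):
        """Squared error of log Q against the Cobb-Douglas prediction."""
        logA, alpha, beta = p
        return np.sum((np.log(Q) - (logA + alpha*np.log(K) + beta*np.log(L)))**2)

    def ga(pop_size=80, gens=300, sigma=0.05):
        pop = rng.uniform(-1, 1, (pop_size, 3))
        for _ in range(gens):
            fit = np.array([loss(p) for p in pop])
            parents = pop[np.argsort(fit)[:pop_size // 2]]
            pa = parents[rng.integers(0, len(parents), pop_size)]
            pb = parents[rng.integers(0, len(parents), pop_size)]
            w = rng.random((pop_size, 1))
            children = w * pa + (1 - w) * pb         # blend parents' characteristics
            children += rng.normal(0, sigma, children.shape)
            children[0] = parents[0]                 # elitism
            pop = children
        fit = np.array([loss(p) for p in pop])
        return pop[np.argmin(fit)]

    logA, alpha, beta = ga()
    ```

    Because the model is log-linear, this particular fit could also be done by ordinary least squares; the GA route matters when constraints or non-log-linear extensions are added.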

    Deep Learning for Real-time Gravitational Wave Detection and Parameter Estimation: Results with Advanced LIGO Data

    The recent Nobel-prize-winning detections of gravitational waves from merging black holes, and the subsequent detection of the collision of two neutron stars in coincidence with electromagnetic observations, have inaugurated a new era of multimessenger astrophysics. To enhance the scope of this emergent field of science, we pioneered the use of deep learning with convolutional neural networks that take time-series inputs for rapid detection and characterization of gravitational wave signals. This approach, Deep Filtering, was initially demonstrated using simulated LIGO noise. In this article, we present the extension of Deep Filtering to real LIGO data, for both detection and parameter estimation of gravitational waves from binary black hole mergers using continuous data streams from multiple LIGO detectors. We demonstrate for the first time that machine learning can detect and estimate the true parameters of real events observed by LIGO. Our results show that Deep Filtering achieves similar sensitivities and lower errors than matched-filtering while being far more computationally efficient and more resilient to glitches. This allows real-time processing of weak time-series signals in non-stationary, non-Gaussian noise with minimal resources, and enables the detection of new classes of gravitational wave sources that may go unnoticed with existing detection algorithms. This unified framework for data analysis is ideally suited to enable coincident detection campaigns of gravitational waves and their multimessenger counterparts in real time.
    Comment: 6 pages, 7 figures; first application of deep learning to real LIGO events; includes direct comparison against matched-filtering
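    The matched-filtering baseline that Deep Filtering is compared against can be sketched in a few lines. The toy chirp, sampling rate, amplitude, and injection offset below are illustrative choices, not LIGO's actual pipeline, and the noise is taken to be white for simplicity.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 1024
    t = np.arange(0, 1.0, 1 / fs)

    # Toy chirp template, and a noisy data stream with the signal injected at a known offset
    template = np.sin(2 * np.pi * (20 * t + 40 * t**2)) * np.hanning(t.size)
    data = rng.normal(0.0, 1.0, 4 * t.size)
    offset = 2000
    data[offset:offset + t.size] += 0.5 * template

    # Matched filter: slide the template along the data (the noise is already
    # white here) and normalize to an SNR-like detection statistic.
    corr = np.correlate(data, template, mode="valid")
    snr = corr / np.sqrt(np.sum(template**2))
    peak = int(np.argmax(np.abs(snr)))   # estimated arrival index, near `offset`
    ```

    The computational cost of sliding templates over long data streams, multiplied across a large template bank, is what motivates replacing this stage with a trained network.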

    A Note on likelihood estimation of missing values in time series

    Get PDF
    Missing values in time series can be treated as unknown parameters and estimated by maximum likelihood, or as random variables and predicted by the expectation of the unknown values given the data. The difference between these two procedures is illustrated by an example. It is argued that the second procedure is, in general, more relevant for estimating missing values in time series.
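    The second procedure, prediction by conditional expectation, can be illustrated for a Gaussian AR(1) model with known coefficient phi, where the expectation of an interior missing value given its two neighbours has a closed form. The series and the deleted point below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    phi, n = 0.8, 500

    # Simulate a Gaussian AR(1) series and delete one interior value
    y = np.zeros(n)
    for i in range(1, n):
        y[i] = phi * y[i - 1] + rng.normal()
    t_miss = 250
    y_obs = y.copy()
    y_obs[t_miss] = np.nan

    # Conditional expectation of the missing value given its neighbours:
    # E[y_t | y_{t-1}, y_{t+1}] = phi * (y_{t-1} + y_{t+1}) / (1 + phi^2)
    y_hat = phi * (y_obs[t_miss - 1] + y_obs[t_miss + 1]) / (1 + phi**2)
    ```

    In practice phi is itself estimated from the observed data, and the prediction carries a variance of 1 / (1 + phi^2), which the parameter-estimation view of the missing value does not report.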